Automatic Selection of Parameters in Spline Regression via Kullback-Leibler Information
Abstract
Based on Kullback-Leibler information, we propose a data-driven selector, called GAIC(c), for choosing the parameters of regression splines in nonparametric regression via a stepwise forward/backward knot placement and deletion strategy [1]. This criterion unifies the commonly used information criteria and includes the Akaike information criterion (AIC) [2] and the corrected Akaike information criterion (AICC) [3] as special cases. To show the performance of GAIC(c) for c = 1/2, 3/4, 7/8, and 15/16, we compare it with cross-validation (CV), generalized cross-validation (GCV), AIC, and AICC in an extensive simulation study. Applications to the selection of the penalty parameters of smoothing splines are also discussed. Our simulation results indicate that the information criteria work well and are superior to the cross-validation-based criteria in most of the cases considered, particularly in small samples. Under certain mild conditions, GAIC(c) is shown to be asymptotically optimal for choosing the number of equispaced and random knots of regression splines.
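To make the role of such criteria concrete, the following is a minimal sketch (in Python with NumPy and SciPy, which are not part of the paper) of choosing the number of equispaced interior knots of a cubic regression spline by minimizing an AIC-type criterion. Only the AIC and one common small-sample corrected form (AICC) are shown; the GAIC(c) penalty itself is not reproduced here, and the simulated data, the helper information_criterion, and the knot grid are illustrative assumptions.

import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def information_criterion(rss, n, n_coef, corrected=False):
    # Gaussian-likelihood-based AIC: n*log(RSS/n) + 2*(number of coefficients).
    # The corrected form adds the Hurvich-Tsai small-sample penalty term.
    crit = n * np.log(rss / n) + 2 * n_coef
    if corrected:
        crit += 2 * n_coef * (n_coef + 1) / (n - n_coef - 1)
    return crit

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

best_crit, best_knots = np.inf, None
for n_knots in range(1, 16):
    # Equispaced interior knots strictly inside the data range.
    t = np.linspace(x[0], x[-1], n_knots + 2)[1:-1]
    spline = LSQUnivariateSpline(x, y, t, k=3)   # cubic regression spline
    rss = spline.get_residual()                  # residual sum of squares
    n_coef = len(t) + 4                          # number of cubic B-spline coefficients
    crit = information_criterion(rss, x.size, n_coef, corrected=True)
    if crit < best_crit:
        best_crit, best_knots = crit, n_knots

print("selected number of interior knots:", best_knots)

The same search loop would apply to GAIC(c) or to a forward/backward knot placement and deletion strategy once the corresponding penalty and candidate set are substituted; only the criterion evaluated at each candidate fit changes.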
Similar Resources
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we intend to examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, concerning the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we examine measures such as Kullback-Leibler information, J-divergence, Hellinger distance, and related divergences. Properties and results related to distance between probability d...
Variable Selection with Akaike Information Criteria: a Comparative Study
In this paper, the problem of variable selection in linear regression is considered. This problem involves choosing the most appropriate model from the candidate models. Variable selection criteria based on estimates of the Kullback-Leibler information are most common. Akaike's AIC and the bias-corrected AIC belong to this group of criteria. The reduction of the bias in estimating the Kullback-Leib...
Markov-switching Model Selection Using Kullback-Leibler Divergence
In Markov-switching regression models, we use the Kullback-Leibler (KL) divergence between the true and candidate models to select the number of states and the variables simultaneously. Specifically, we derive a new information criterion, the Markov switching criterion (MSC), which is an estimate of the KL divergence. MSC imposes an appropriate penalty to mitigate the over-retention of states in the Markov chai...
Frailty Model with Spline Estimated Nonparametric Hazard Function
Frailty has been introduced as a group-wise random effect to describe the within-group dependence for correlated survival data. In this article, we propose a penalized joint likelihood method for nonparametric estimation of hazard function. With the proposed method, the frailty variance component and the smoothing parameters become the tuning parameters that are selected to minimize a loss func...